
    Performance Models for Data Transfers: A Case Study with Molecular Chemistry Kernels

    With the increasing complexity of hardware, systems with different memory nodes are ubiquitous in High Performance Computing (HPC). It is paramount to develop strategies that overlap data transfers between memory nodes with computations in order to exploit the full potential of these systems. In this article, we consider the problem of deciding the order of data transfers between two memory nodes for a set of independent tasks, with the objective of minimizing the makespan. We prove that with limited memory capacity, finding the optimal order of data transfers is an NP-complete problem. We propose several heuristics for this problem and characterize the situations in which each is favorable. We present an analysis of our heuristics on traces obtained by running two molecular chemistry kernels, namely Hartree-Fock (HF) and Coupled Cluster Singles and Doubles (CCSD), on 10 nodes of an HPC system. Our results show that some of our heuristics achieve significant overlap for moderate memory capacities and come very close to the lower bound on the makespan.
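
    The paper's own heuristics are not reproduced in the abstract, but the model it describes (one transfer channel overlapped with one compute unit, transfers blocked when memory is full) is easy to sketch. Below is a minimal, hypothetical event-driven simulator for the makespan of a given transfer order, plus one plausible ordering rule (transfer the longest computations first, so later transfers hide behind them); the Task fields, the FIFO compute order, and the ordering rule itself are illustrative assumptions, not the article's algorithms.

```python
from dataclasses import dataclass

@dataclass
class Task:
    transfer: float  # time to move the task's input between memory nodes
    compute: float   # time to run the task once its input is resident
    size: float      # memory footprint of the task's input

def makespan(order, tasks, capacity):
    """Event-driven sketch: one transfer channel overlapped with one compute
    unit; tasks are computed in transfer order (FIFO), and a transfer may
    start only once enough memory has been freed by finished computations."""
    transfer_free = compute_free = used = 0.0
    resident = []  # (release_time, size), released in FIFO compute order
    for i in order:
        t = tasks[i]
        assert t.size <= capacity, "a single task must fit in memory"
        start = transfer_free
        while used + t.size > capacity:     # wait for memory to be released
            release, size = resident.pop(0)
            start = max(start, release)
            used -= size
        transfer_free = start + t.transfer
        compute_free = max(transfer_free, compute_free) + t.compute
        used += t.size
        resident.append((compute_free, t.size))
    return compute_free

tasks = [Task(3, 1, 2), Task(1, 4, 1), Task(2, 2, 2)]
naive = list(range(len(tasks)))
by_compute = sorted(naive, key=lambda i: -tasks[i].compute)
print(makespan(naive, tasks, capacity=4))       # 10.0
print(makespan(by_compute, tasks, capacity=4))  # 9.0
```

    On this toy instance the compute-aware order wins (9 vs. 10 time units) because the longest computation starts early and hides the remaining transfers.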

    Optimizing egalitarian performance in the side-effects model of colocation for data center resource management

    In data centers, up to dozens of tasks are colocated on a single physical machine. Machines are used more efficiently, but tasks' performance deteriorates, as colocated tasks compete for shared resources. As tasks are heterogeneous, the resulting performance dependencies are complex. In our previous work [18] we proposed a new combinatorial optimization model that uses two parameters of a task - its size and its type - to characterize how a task influences the performance of other tasks allocated to the same machine. In this paper, we study the egalitarian optimization goal: maximizing the worst-off performance. This problem generalizes the classic makespan minimization on multiple processors (P||Cmax). We prove that polynomially-solvable variants of multiprocessor scheduling become NP-hard and hard to approximate when the number of types is not constant. For a constant number of types, we propose a PTAS, a fast approximation algorithm, and a series of heuristics. We simulate the algorithms on instances derived from a trace of one of Google's clusters. Algorithms aware of jobs' types lead to better performance than algorithms solving P||Cmax. The notion of type enables us to model the performance degradation caused by using standard combinatorial optimization methods. Types add a layer of additional complexity; however, our results - approximation algorithms and good average-case performance - show that types can be handled efficiently.
    Comment: Author's version of a paper published in the Euro-Par 2017 Proceedings; extends the published paper with additional results and proofs.
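
    The precise size-and-type model is specified in [18] and not restated in the abstract, so the sketch below is only a toy instantiation of the idea: an assumed pairwise influence matrix, a perceived-load score per task, and an LPT-style greedy placement for the egalitarian (worst-off) objective. The INFLUENCE matrix, the load formula, and the greedy rule are illustrative assumptions, not the paper's PTAS or its heuristics.

```python
# Hypothetical influence matrix: INFLUENCE[u][v] is how much one unit of a
# type-u task degrades a colocated type-v task (an assumption, not from [18]).
INFLUENCE = [[1.0, 0.3],
             [0.5, 1.0]]

def worst_perceived_load(machine):
    """Each task perceives the influence-weighted sizes of everything on its
    machine (itself included); the machine is scored by its worst-off task."""
    return max((sum(INFLUENCE[t][ti] * s for s, t in machine)
                for _, ti in machine), default=0.0)

def greedy_assign(tasks, n_machines):
    """LPT-style greedy: place large tasks first, each onto the machine where
    the resulting worst perceived load is smallest."""
    machines = [[] for _ in range(n_machines)]
    for task in sorted(tasks, key=lambda st: -st[0]):
        best = min(machines, key=lambda m: worst_perceived_load(m + [task]))
        best.append(task)
    return machines

tasks = [(4, 0), (3, 1), (2, 0), (2, 1), (1, 0)]  # (size, type) pairs
for machine in greedy_assign(tasks, n_machines=2):
    print(machine, round(worst_perceived_load(machine), 2))
```

    The egalitarian objective is the maximum of worst_perceived_load over machines; a type-blind P||Cmax heuristic would balance raw sizes only and ignore the cross-type terms.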

    Modeling the mobility of living organisms in heterogeneous landscapes: Does memory improve foraging success?

    Thanks to recent technological advances, it is now possible to track, with unprecedented precision and over long periods of time, the movement patterns of many living organisms in their habitat. The increasing amount of data available on single trajectories offers the possibility of understanding how animals move and of testing basic movement models. Random walks have long represented the main description for micro-organisms and have also been useful for understanding the foraging behaviour of large animals. Nevertheless, most vertebrates, in particular humans and other primates, rely on sophisticated cognitive tools such as spatial maps, episodic memory and travel cost discounting. These properties call for other modeling approaches to mobility patterns. We propose a foraging framework where a learning mobile agent uses a combination of memory-based and random steps. We investigate how advantageous it is to use memory for exploiting resources in heterogeneous and changing environments. An adequate balance of determinism and random exploration is found to maximize the foraging efficiency and to generate trajectories with an intricate spatio-temporal order. Based on this approach, we propose some tools for analysing the non-random nature of mobility patterns in general.
    Comment: 14 pages, 4 figures, improved discussion.
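
    As a rough illustration of the memory-versus-randomness trade-off described above, here is a minimal toy simulation (not the authors' model): an agent on a periodic grid either steps toward the richest patch it remembers, with probability q, or takes a random-walk step; patches deplete when harvested and regrow slowly. The landscape, the regrowth rule and the single mixing parameter q are illustrative assumptions.

```python
import random

def forage(steps=2000, size=25, q=0.5, regrow=0.02, seed=0):
    """Agent mixes memory and randomness: with probability q it steps toward
    the richest patch it remembers, otherwise it takes a random-walk step.
    Harvested patches deplete and regrow slowly, so pure exploitation (q=1)
    and pure random search (q=0) can both be suboptimal."""
    rng = random.Random(seed)
    # heterogeneous landscape: a handful of rich patches on poor ground
    cap = {(rng.randrange(size), rng.randrange(size)): rng.uniform(5, 10)
           for _ in range(size)}
    stock = dict(cap)
    pos, memory, gathered = (size // 2, size // 2), {}, 0.0
    for _ in range(steps):
        if memory and rng.random() < q:
            tx, ty = max(memory, key=memory.get)   # best remembered patch
            pos = (pos[0] + (tx > pos[0]) - (tx < pos[0]),
                   pos[1] + (ty > pos[1]) - (ty < pos[1]))
        else:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            pos = ((pos[0] + dx) % size, (pos[1] + dy) % size)
        if pos in stock:
            gathered += stock[pos]
            memory[pos] = cap[pos]   # episodic memory of the patch's quality
            stock[pos] = 0.0         # harvesting depletes the patch
        for p in stock:              # slow regrowth toward each patch's cap
            stock[p] = min(stock[p] + regrow * cap[p], cap[p])
    return gathered / steps          # foraging efficiency: mean intake per step

for q in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(q, round(forage(q=q), 3))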

    1-O-alkyl-2-acetyl-sn-glycero-3-phosphorylcholine (Platelet Activating Factor): Interactions with Sheep Platelets

    Platelet Activating Factor (PAF) seems to link platelet function with allergy and inflammation. Early events associated with the action of PAF on platelets are poorly understood, but PAF appears to interact with a platelet membrane receptor. Two photoaffinity-labelling analogs of PAF, diazoacetyl PAF and trifluorodiazoacetyl PAF, were designed for examining the interaction between PAF and its platelet receptor. These photoactive compounds were selected on the basis of the structure-activity relationship of PAF and the general guidelines of the photoaffinity-labelling technique. Some progress has been made in their synthesis and characterization. The standard platelet aggregation assay was unreliable for testing the activity of PAF analogs on sheep platelets. The response of sheep platelets to PAF was characterized in terms of aggregation, serotonin release, ATP release and phosphatidic acid formation, but also with a new and more satisfactory method based on platelet shape change. The development of the latter was based on microscopic and rheo-optic observations but is also supported by theoretical considerations. A shape change parameter (SCP) was defined and, accordingly, the method is called the SCP assay. The SCP assay demonstrated, for the first time, that platelet shape change is sensitive to small changes (less than an order of magnitude) in nanomolar PAF concentrations. The SCP assay proved superior to the aggregation assay for the characterization of the photoaffinity-labelling analogs in a comparison of sensitivity to PAF, PAF analogs and modulators of platelet activation. A wider range of PAF concentrations could be distinguished with the SCP assay, and it was more sensitive than the aggregation assay to structural differences between lyso PAF, PAF, propanoyl PAF, butanoyl PAF, hexanoyl PAF and oleoyl PAF. The order of relative activities observed with the analogs did not differ between the assays. The SCP assay extended the range of detectable PAF levels to concentrations lower than those measurable by aggregation, serotonin release or ATP release. Modulators of platelet activation, such as prostacyclin, TMB-8 and indomethacin, had similar effects on the magnitude of response in the aggregation and SCP assays. Both ATP release and SCP were inhibited in proportion to the concentration of prostacyclin. Neither aggregation nor SCP was affected by indomethacin at concentrations that inhibit the response to thrombin and collagen, supporting the definition of PAF as a mediator of a third pathway of platelet activation. The relationships of shape change and platelet aggregation to platelet activation suggest that the new assay will be of more value than the aggregation assay in characterizing the initial interaction of PAF with platelets.

    Iron supplementation: indications, limits and modalities

    During the past 10 years, the understanding of iron metabolism has been revolutionized by the discovery of the main regulatory hormone of body iron, hepcidin. Meanwhile, new formulations of intravenous iron have been developed and are already available or soon will be. In this article, we review the recently described pathophysiological mechanisms underlying anemia of chronic disease and iron deficiency anemia. We describe the various treatment modalities for iron deficiency anemia using the oral or intravenous route, as well as emerging indications for iron treatment. Finally, we discuss the situations in which iron supplementation may be harmful.

    LU Decomposition on Cell Broadband Engine: An Empirical Study to Exploit Heterogeneous Chip Multiprocessors

    To meet the needs of high performance computing, the Cell Broadband Engine has many features that differ from traditional processors, such as the large number of synergistic processor elements, large register files, and the ability to hide main-storage latency with concurrent computation and DMA transfers. Exploiting those features requires the programmer to carefully tailor programs and simultaneously deal with various performance factors, including locality, load balance, communication overhead, and multi-level parallelism. These factors, unfortunately, are dependent on each other; an optimization that enhances one factor may degrade another. This paper presents our experience optimizing LU decomposition, one of the commonly used algebra kernels in scientific computing, on the Cell Broadband Engine. The optimizations exploit task-level, data-level, and communication-level parallelism. We study the effects of different task distribution strategies, prefetch, and software cache, and explore the tradeoffs among different performance factors, stressing the interactions between different optimizations. This work offers some insights into optimization on heterogeneous multi-core processors, including the selection of programming models, considerations in task distribution, and the holistic perspective required in optimizations.
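
    The Cell-specific machinery discussed above (SPE task distribution, DMA prefetch, software cache) does not survive in a short snippet, but the kernel being optimized does. Below is a minimal, portable sketch of a right-looking blocked LU factorization without pivoting, written with NumPy for clarity; the tile structure of the trailing update is what a task-distribution strategy on a heterogeneous multi-core would parcel out. The block size b and the no-pivoting simplification are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def blocked_lu(A, b=2):
    """Right-looking blocked LU factorization without pivoting, in place:
    A is overwritten with L (unit lower triangle) and U."""
    n = A.shape[0]
    for k in range(0, n, b):
        e = min(k + b, n)
        # unblocked LU of the diagonal block A[k:e, k:e]
        for j in range(k, e):
            A[j+1:e, j] /= A[j, j]
            A[j+1:e, j+1:e] -= np.outer(A[j+1:e, j], A[j, j+1:e])
        if e < n:
            L11 = np.tril(A[k:e, k:e], -1) + np.eye(e - k)
            U11 = np.triu(A[k:e, k:e])
            A[k:e, e:] = np.linalg.solve(L11, A[k:e, e:])        # panel U12
            A[e:, k:e] = np.linalg.solve(U11.T, A[e:, k:e].T).T  # panel L21
            # trailing update: independent tiles, the main source of parallelism
            A[e:, e:] -= A[e:, k:e] @ A[k:e, e:]
    return A

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6)) + 6 * np.eye(6)  # diagonally dominant: safe without pivoting
LU = blocked_lu(M.copy())
L, U = np.tril(LU, -1) + np.eye(6), np.triu(LU)
print(np.allclose(L @ U, M))  # True
```

    The trailing update dominates the flops and decomposes into independent tiles, which is why task distribution, prefetch and software caching pay off there.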

    OStrich: Fair Scheduling for Multiple Submissions

    Campaign Scheduling is characterized by multiple job submissions issued by multiple users over time. This model suits today's systems well, since most available parallel environments have multiple users sharing a common infrastructure. When scheduling the jobs submitted by various users individually, one crucial issue is to ensure fairness. This work presents a new fair scheduling algorithm called OStrich, whose principle is to maintain a virtual time-sharing schedule in which the same share of processors is assigned to each user. The completion times in the virtual schedule determine the execution order on the physical processors; the campaigns are thus interleaved in a fair way. For independent sequential jobs, we show that OStrich guarantees that the stretch of a campaign is proportional to the campaign's size and the total number of users. The stretch measures by what factor a workload is slowed down relative to the time it would take on an unloaded system. The performance of our solution is assessed by simulating OStrich against the classical FCFS algorithm on synthetic workload traces generated from two different user profiles, demonstrating how OStrich benefits both types of users, in contrast to FCFS.
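
    A bare-bones sketch of the idea may help: each user is given an equal virtual share of the machine, jobs acquire virtual completion times under a fluid approximation of that time-shared schedule, and the real processors execute jobs in virtual-completion order. The fluid approximation and the dispatch loop below are simplifying assumptions for illustration, not OStrich's actual bookkeeping.

```python
import heapq

def ostrich_order(campaigns, m):
    """Toy OStrich-like scheduler for one batch of campaigns.
    campaigns: dict user -> list of sequential job lengths, in arrival order.
    Each user virtually owns m/len(campaigns) processors; jobs are released
    to the m real processors by increasing virtual completion time."""
    share = m / len(campaigns)        # equal virtual share per user
    order = []
    for user, jobs in campaigns.items():
        t = 0.0
        for j, length in enumerate(jobs):
            t += length / share       # fluid approximation of the virtual schedule
            order.append((t, user, j, length))
    order.sort()                      # earlier virtual completion runs first
    free = [0.0] * m                  # next-free times of the real processors
    heapq.heapify(free)
    finish = {}
    for _, user, j, length in order:  # greedy list scheduling on real processors
        start = heapq.heappop(free)
        finish[(user, j)] = start + length
        heapq.heappush(free, start + length)
    return [(user, j) for _, user, j, _ in order], finish

order, finish = ostrich_order({"alice": [4, 4, 4], "bob": [1, 1, 1, 1]}, m=2)
print(order)  # users are interleaved rather than served FCFS
```

    With FCFS, whichever campaign arrived first would monopolize the machine; here bob's short jobs finish early in the virtual schedule and are interleaved with alice's long ones.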

    Helicopter and ground emergency medical services transportation to hospital after major trauma in England: a comparative cohort study

    Background: The utilization of helicopter emergency medical services (HEMS) in modern trauma systems has been a source of debate for many years. This study set out to establish the true impact of HEMS on survival for patients with major trauma in England. Methods: A comparative cohort design using prospectively recorded data from the UK Trauma Audit and Research Network registry. 279 107 patients were identified between January 2012 and March 2017. The primary outcome measure was risk-adjusted in-hospital mortality within propensity-score-matched cohorts, estimated by logistic regression analysis. Subset analyses were performed for subjects with prehospital Glasgow Coma Scale ≤9 and systolic blood pressure <90 mm Hg. Results: The analysis was based on 61 733 adult patients directly admitted to major trauma centers: 54 185 transported by ground emergency medical services (GEMS) and 7548 by HEMS. HEMS patients were more likely to be male, younger, more severely injured, victims of road traffic collisions, and intubated at the scene. Crude mortality was higher for HEMS patients. Logistic regression demonstrated a 15% reduction in the risk-adjusted odds of death (OR=0.846; 95% CI 0.684 to 1.046) in favor of HEMS. When the analysis was restricted to patients previously noted to benefit most from HEMS, the odds of death were reduced further but remained statistically consistent with no effect. Sensitivity analysis on 5685 patients attended by a doctor on scene but transported by GEMS demonstrated a protective effect on mortality versus the standard GEMS response (OR 0.77; 95% CI 0.62 to 0.95). Discussion: This prospective, level 3 cohort analysis demonstrates a non-significant survival advantage for patients transported by HEMS versus GEMS. Despite the large size of the cohort, the intrinsic mismatch in patient demographics limits the ability to statistically assess the true benefit of HEMS. It does, however, demonstrate improved survival for patients attended by doctors on scene in addition to the GEMS response. Improvements in prehospital data and increased trauma unit reporting are required to accurately assess the clinical and cost-effectiveness of HEMS.
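
    The registry data behind this study is not public, but the analysis pipeline named in the abstract (a propensity model for HEMS dispatch, matched cohorts, logistic-regression odds of death) can be sketched generically. Everything below - the synthetic covariates, the caliper, matching with replacement, and the Woolf confidence interval - is an illustrative assumption, not the authors' protocol.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_matched_or(X, treated, died, caliper=0.05):
    """1:1 nearest-neighbour propensity matching (with replacement, for
    brevity), then the odds ratio of death with a Woolf 95% CI.
    Sketch only: assumes all four cells of the matched table are non-zero."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.flatnonzero(treated == 1)
    c_idx = np.flatnonzero(treated == 0)
    pairs = []
    for i in t_idx:
        d = np.abs(ps[c_idx] - ps[i])
        if d.min() <= caliper:                 # discard unmatchable subjects
            pairs.append((i, c_idx[np.argmin(d)]))
    t = np.array([i for i, _ in pairs])
    c = np.array([j for _, j in pairs])
    a, b = died[t].sum(), len(t) - died[t].sum()    # HEMS deaths / survivors
    cc, dd = died[c].sum(), len(c) - died[c].sum()  # GEMS deaths / survivors
    oratio = (a * dd) / (b * cc)
    se = np.sqrt(1 / a + 1 / b + 1 / cc + 1 / dd)   # Woolf standard error
    return oratio, np.exp(np.log(oratio) + np.array([-1.96, 1.96]) * se)

# Synthetic demo: severity drives both HEMS dispatch and mortality,
# which is exactly the confounding the matching is meant to remove.
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 3))                     # fake covariates
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # sicker -> HEMS
died = rng.binomial(1, 1 / (1 + np.exp(-(X[:, 0] - 2))))
print(propensity_matched_or(X, treated, died))
```

    A covariate-adjusted model on the matched set would be closer to the study's risk-adjusted estimate; this sketch only shows the mechanics of matching and the interval computation.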

    Applying the natural capital approach to decision making for the marine environment

    Aspirations for natural capital and ecosystem service approaches to support environmental decision-making have not been fully realised in terms of actual application in policy and management contexts. Applying the natural capital approach requires a range of methods, which as yet have not been fully tested in the context of decision making for the marine environment. It is unlikely that existing methodologies, which were developed for terrestrial systems and are based on land cover assessment, will ever be feasible in the marine context at the national scale. Land cover approaches are also fundamentally insufficient for the marine environment because they do not account for the water column, the significant interconnections between spatially disparate components, or the highly dynamic nature of the marine ecosystem, for example the high spatial mobility of many species. Data gaps have been a significant impediment to progress, so alternative methods that use proxies for quality information, as well as the opportunities offered by remote sensing, should be explored further. Greater effort to develop methodologies specifically for the marine environment is required; this work should be interdisciplinary and cross-sectoral, coherent across policy areas, and applicable across a range of contexts.